    Ensemble Kalman filter versus ensemble smoother for assessing hydraulic conductivity via tracer test data assimilation

    Abstract. Estimating the spatial variability of hydraulic conductivity K in natural aquifers is important for predicting the transport of dissolved compounds. Especially in the nonreactive case, plume evolution is mainly controlled by the heterogeneity of K. At the local scale, the spatial distribution of K can be inferred by combining the Lagrangian formulation of transport with a Kalman-filter-based technique and assimilating a sequence of time-lapse concentration C measurements, which, for example, can be evaluated on site through the application of a geophysical method. The objective of this work is to compare the capabilities of the ensemble Kalman filter (EnKF) and the ensemble smoother (ES) to retrieve the spatial distribution of hydraulic conductivity in a groundwater flow and transport modeling framework. The application refers to a two-dimensional synthetic aquifer in which a tracer test is simulated. Moreover, since Kalman-filter-based methods are optimal only if each of the involved variables fits a Gaussian probability density function (pdf), and since this condition may not be met by some of the flow and transport state variables, issues related to the non-Gaussianity of the variables are analyzed and different transformations of the pdfs are considered in order to evaluate their influence on the performance of the methods. The results show that the EnKF reproduces the hydraulic conductivity field with good accuracy, outperforming the ES regardless of the pdf of the concentrations.
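
    As a rough illustration of the assimilation step described in this abstract, the sketch below implements one stochastic EnKF analysis for an ensemble of log-conductivity fields updated with time-lapse concentration observations. It is a minimal sketch, not the authors' code: the array names, shapes, and the perturbed-observation variant are illustrative assumptions.

        import numpy as np

        def enkf_update(log_K_ens, C_ens, obs, obs_err_std, rng=None):
            """One stochastic EnKF analysis step (illustrative sketch).

            log_K_ens   : (n_cells, n_ens) ensemble of log-conductivity fields
            C_ens       : (n_obs, n_ens) simulated concentrations at the observation points
            obs         : (n_obs,) measured concentrations at this assimilation time
            obs_err_std : standard deviation of the measurement error
            """
            rng = np.random.default_rng() if rng is None else rng
            n_ens = log_K_ens.shape[1]

            # Ensemble anomalies (deviations from the ensemble means)
            A = log_K_ens - log_K_ens.mean(axis=1, keepdims=True)
            Y = C_ens - C_ens.mean(axis=1, keepdims=True)

            # Cross-covariance and innovation covariance estimated from the ensemble
            C_xy = A @ Y.T / (n_ens - 1)
            C_yy = Y @ Y.T / (n_ens - 1) + obs_err_std**2 * np.eye(len(obs))

            # Kalman gain and perturbed-observation update of each ensemble member
            K = C_xy @ np.linalg.inv(C_yy)
            obs_pert = obs[:, None] + rng.normal(0.0, obs_err_std, size=C_ens.shape)
            return log_K_ens + K @ (obs_pert - C_ens)

    The EnKF repeats an update of this form sequentially after each time-lapse measurement, whereas the ES applies a single update with all concentration times stacked into one observation vector.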

    Enhancing in situ biodegradation in groundwater using pump and treat remediation: a proof of concept and modelling analysis of controlling variables

    A remediation approach which uses pump and treat (PAT) to enhance the biodegradation of organic contaminants by increasing dispersive mixing between plumes and groundwater was evaluated for a phenol-contaminated aquifer, using a reactive transport model which simulates kinetic reactions between an electron donor (ED) in the plume and an electron acceptor (EA) in the groundwater. The influence of system design and operation was examined in six modelling scenarios. Injection or extraction of groundwater increases biodegradation relative to no action, and the location of the well(s), the pumping rate, and the distance between wells are important variables which influence biodegradation. An increase in pumping rate, an increase in the distance of the wells from the plume centreline, and a change in the flow direction all increase dispersive mixing between the plume and the groundwater. This increases plume spreading and the plume fringe interface, providing a greater flux of dissolved EAs for biodegradation. In general, injection of groundwater containing natural EAs enhances biodegradation more than extraction. The enhancement of biodegradation is sensitive to the relative fluxes of ED and EA, as controlled by the arrangement of the wells. In the best-performing scenario, biodegradation was enhanced by 128% compared with no action.
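
    As a minimal sketch of the kind of kinetic reaction term described in this abstract, the function below evaluates a dual-Monod rate law coupling an electron donor and an electron acceptor; the function name and the default rate and half-saturation constants are illustrative assumptions, not parameters from the study.

        def dual_monod_rate(c_ed, c_ea, k_max=1.0, K_ed=0.5, K_ea=0.2):
            """Dual-Monod biodegradation rate (illustrative parameter values).

            c_ed : electron donor concentration in the plume (e.g. phenol)
            c_ea : electron acceptor concentration in the groundwater
            """
            # The rate vanishes when either reactant is absent, so degradation is
            # confined to the plume fringe where dispersive mixing brings the ED
            # and EA into contact -- the mixing that PAT pumping is used to enhance.
            return k_max * (c_ed / (K_ed + c_ed)) * (c_ea / (K_ea + c_ea))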

    Well-resolved velocity fields using discontinuous Galerkin shallow water solutions

    Computational models based on the depth-averaged shallow water equations (SWE) offer an efficient choice for analysing velocity fields around hydraulic structures. Second-order finite volume (FV2) solvers have often been used for this purpose, subject to adding an eddy viscosity term at sub-meter resolution, but have been shown to fall short of capturing small-scale field transients emerging from wave-structure interactions. The second-order discontinuous Galerkin (DG2) alternative is significantly more resistant to the growth of numerical diffusion and leads to faster convergence rates. These properties make the DG2 solver a promising modelling tool for detailed velocity field predictions. This paper explores this DG2 capability, with reference to an FV2 counterpart, for a selection of test cases that require well-resolved velocity field predictions. The findings of this work lead to identifying a particular setting for the DG2 solver that yields more accurate and efficient depth-averaged velocity fields incorporating small-scale transients.
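
    For context, a standard conservative form of the depth-averaged SWE discretised by both FV2 and DG2 schemes is given below (h water depth, u and v depth-averaged velocity components, g gravitational acceleration, z_b bed elevation, S_fx and S_fy friction source terms); the eddy viscosity term mentioned for the FV2 runs is omitted from this sketch.

        \[
        \partial_t \mathbf{U} + \partial_x \mathbf{F}(\mathbf{U}) + \partial_y \mathbf{G}(\mathbf{U}) = \mathbf{S}(\mathbf{U}), \qquad
        \mathbf{U} = \begin{pmatrix} h \\ hu \\ hv \end{pmatrix}, \quad
        \mathbf{F} = \begin{pmatrix} hu \\ hu^{2} + \tfrac{1}{2} g h^{2} \\ huv \end{pmatrix}, \quad
        \mathbf{G} = \begin{pmatrix} hv \\ huv \\ hv^{2} + \tfrac{1}{2} g h^{2} \end{pmatrix}, \quad
        \mathbf{S} = \begin{pmatrix} 0 \\ -g h \, \partial_x z_b - S_{fx} \\ -g h \, \partial_y z_b - S_{fy} \end{pmatrix}
        \]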

    QUBIC: The QU Bolometric Interferometer for Cosmology

    One of the major challenges of modern cosmology is the detection of B-mode polarization anisotropies in the CMB. These originate from tensor fluctuations of the metric produced during the inflationary phase. Their detection would therefore constitute a major step towards understanding the primordial Universe. The expected level of these anisotropies is, however, so small that it requires a new generation of instruments with high sensitivity and extremely good control of systematic effects. We propose the QUBIC instrument, based on the novel concept of bolometric interferometry, which brings together the sensitivity advantages of bolometric detectors with the control of systematic effects afforded by interferometry. The instrument will directly observe the sky through an array of entry horns whose signals will be combined using an optical combiner. The whole set-up is located inside a cryostat. Polarization modulation will be achieved using a rotating half-wave plate, and interference fringes will be imaged on two focal planes (separated by a polarizing grid) tiled with bolometers. We show that QUBIC can be considered a synthetic imager, exactly similar to a usual imager but with a synthesized beam formed by the array of entry horns. Scanning the sky provides an additional modulation of the signal and improves the sky coverage shape. The usual techniques of map-making and power spectrum estimation can then be applied. We show that the sensitivity of such an instrument is comparable with that of an imager with the same number of horns. We anticipate a low level of beam-related systematics thanks to the fact that the synthesized beam is determined by the location of the primary horns. Other systematics should be under good control thanks to an autocalibration technique, specific to our concept, that will permit the accurate determination of most of the systematic parameters. Comment: 12 pages, 10 figures, submitted to Astronomy and Astrophysics.
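
    As a toy illustration of the synthesized beam described in this abstract, the sketch below computes the squared modulus of the coherent sum of plane-wave phase factors over an array of horn positions. The 8x8 grid, 14 mm spacing, and 150 GHz frequency are illustrative assumptions, not the actual QUBIC configuration.

        import numpy as np

        def synthesized_beam(horn_xy, theta, phi, freq_hz=150e9):
            """Normalised synthesized beam of a horn array (toy sketch).

            horn_xy    : (n_horns, 2) horn positions in metres
            theta, phi : arrays of sky direction angles in radians
            """
            lam = 3e8 / freq_hz
            # Direction cosines of the line of sight projected onto the horn plane
            ux = np.sin(theta) * np.cos(phi)
            uy = np.sin(theta) * np.sin(phi)
            # Coherent sum of the phase factors contributed by each horn
            phase = 2j * np.pi / lam * (horn_xy[:, 0, None] * ux + horn_xy[:, 1, None] * uy)
            beam = np.abs(np.exp(phase).sum(axis=0)) ** 2
            return beam / beam.max()

        # Illustrative 8x8 square horn array with 14 mm spacing, beam cut at phi = 0
        xs = np.arange(8) * 0.014
        horns = np.array([(x, y) for x in xs for y in xs])
        theta = np.linspace(0.0, np.radians(15.0), 500)
        beam_cut = synthesized_beam(horns, theta, np.zeros_like(theta))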

    Gaussian Process Modelling for Uncertainty Quantification in Convectively-Enhanced Dissolution Processes in Porous Media

    Numerical groundwater flow and dissolution models of physico-chemical processes in deep aquifers are usually subject to uncertainty in one or more of the model input parameters. This uncertainty is propagated through the equations and needs to be quantified and characterised in order to rely on the model outputs. In this paper we present a Gaussian process emulation method as a tool for performing uncertainty quantification in mathematical models of convection and dissolution processes in porous media. One of the advantages of this method is its ability to significantly reduce the computational cost of an uncertainty analysis compared with classical Monte Carlo methods, while yielding accurate results. We apply the methodology to a model of convectively-enhanced dissolution processes occurring during carbon capture and storage. In this model, the Gaussian process methodology fails due to the presence of multiple branches of solutions emanating from a bifurcation point, i.e., two equilibrium states exist rather than one. To overcome this issue we use a classifier as a precursor to the Gaussian process emulation, after which we are able to successfully perform a full uncertainty analysis in the vicinity of the bifurcation point.
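
    A minimal sketch of the two-stage approach described in this abstract, assuming scikit-learn and a toy one-dimensional problem: a classifier first predicts which solution branch an input belongs to, and a separate Gaussian process emulator is then fitted to the outputs of each branch. All data and kernel choices below are illustrative, not those of the study.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF
        from sklearn.svm import SVC

        # Toy training set: inputs x, branch labels, and simulator outputs y
        rng = np.random.default_rng(0)
        x = rng.uniform(0.0, 1.0, size=(60, 1))
        branch = (x[:, 0] > 0.5).astype(int)  # which equilibrium state was reached
        y = np.where(branch == 0, np.sin(6.0 * x[:, 0]), 2.0 + 0.5 * x[:, 0])

        # Stage 1: classifier predicting the solution branch from the input
        clf = SVC(kernel="rbf").fit(x, branch)

        # Stage 2: one Gaussian process emulator per branch
        gps = {b: GaussianProcessRegressor(kernel=RBF(length_scale=0.2)).fit(x[branch == b], y[branch == b])
               for b in (0, 1)}

        def emulate(x_new):
            """Predict with the emulator of the branch chosen by the classifier."""
            b = clf.predict(x_new)
            out = np.empty(len(x_new))
            for label, gp in gps.items():
                mask = b == label
                if mask.any():
                    out[mask] = gp.predict(x_new[mask])
            return out

        print(emulate(np.array([[0.2], [0.8]])))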

    Cytology, biochemistry and molecular changes during coffee fruit development
